Market Roundup

April 16, 2004

Intel Announces New Itanium Processors, Discusses Future IA-64 Plans

Fighting Terrorism with IT

Microsoft and Micro Focus Announce Mainframe Migration Alliance

Making the Right Moves

 


Intel Announces New Itanium Processors, Discusses Future IA-64 Plans

By Charles King

Intel has introduced two new Itanium 2 processors that the company says are approximately 28% lower in price and up to 25% higher in performance than their predecessors for dual-processor servers. According to Intel, the new processors’ price/performance improvements represent the next step toward the company’s goal of delivering Itanium-based servers with up to twice the performance of Intel Xeon solutions at the same system cost by 2007. The new Itanium 2 processor at 1.4GHz with 3MB of cache is currently available for $1,172 in 1,000-unit quantities. The Itanium 2 at 1.6GHz with 3MB of cache is expected to be available in May 2004 for $2,408 in 1,000-unit quantities. Dual-processor systems using the new processors will be available from vendors including Bull, Dell, Fujitsu Siemens, HCL, IBM, Kraftway, Lenovo, Maxdata, NEC, Samsung, and Transtec AG. According to news reports, Jason Waxman, Intel’s director of multiprocessor marketing, discussed the company’s plans to eliminate Itanium’s price premium by 2007. According to Waxman, these plans include expanding the Itanium 2 product line to better suit the market, enabling Itanium servers to use memory and other components from the company’s Xeon product lines by 2005-2006, and creating a common chipset that can work with both Itanium and Xeon processors by 2007.
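To put the announced numbers in perspective, a back-of-the-envelope calculation is useful. The minimal sketch below uses only the $1,172 list price from the announcement; treating the “up to 25%” performance gain as a fixed figure, and deriving the predecessor’s price by grossing the new price back up by 28%, are our simplifications for illustration.

```python
# Back-of-the-envelope price/performance arithmetic from the announced figures.
# Only the $1,172 list price is from the announcement; treating "up to 25%
# higher performance" and "28% lower price" as exact is our simplification.

new_price = 1172                      # 1.4GHz/3MB Itanium 2, per 1,000 units
old_price = new_price / (1 - 0.28)    # implied predecessor price (~$1,628)

new_perf = 1.25                       # performance relative to predecessor = 1.0
old_perf = 1.0

def dollars_per_perf(price, perf):
    """Price paid per unit of relative performance."""
    return price / perf

improvement = 1 - dollars_per_perf(new_price, new_perf) / dollars_per_perf(old_price, old_perf)
print(f"Implied price/performance improvement: {improvement:.0%}")   # roughly 42%
```

By this rough measure, the price cut and the performance gain compound into a substantially larger step than either headline figure suggests on its own.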

Intel’s new Itanium 2 processors, along with the upcoming plans for Itanium and Xeon, represent further public disclosure of plans to change the course of the company’s 64-bit computing strategy. The first intimations of this effort came in February, when Intel announced plans to develop Opteron-style 64-bit extension capabilities for its Xeon products, and to deliver price parity between Xeon and Itanium by 2007. In a sense, these efforts acknowledge the serious mistake Intel made in pursuing a muscular, monolithic 64-bit computing platform in a market where a host of competing products already held sway. That approach could have worked to Intel’s advantage if Itanium had reached the market on schedule. But the high-tech bust and a shifting IT market where flexibility means as much as, or more than, muscle have made it difficult for Itanium to gain meaningful momentum, especially in the general-purpose server market Intel originally coveted. However, the company has succeeded before by turning apparently overwhelming challenges to its advantage. The sizeable bet Intel is making here is that by delivering 64-bit processors at the same price as its well-regarded 32-bit Xeon solutions, even customers and ISVs with little, if any, need to upgrade to 64-bit solutions will flock to the Itanium bargain basement. To sweeten the pot, Intel will create dual-platform-friendly components and chipsets designed to simplify manufacturing processes and cut costs for both the company and its OEM partners.

Beyond trying to set its plans for enterprise computing domination onto a new set of tracks, Intel’s decision to pursue a volume pricing and development model for Itanium despite the lack of volume sales has repercussions for the rest of the IT industry. For the past two years, increasing commodity pressures in the high-end IT market have driven the prices and margins of server components downward. The result has been an increasingly tough environment that vendors with deep software and service offerings, such as IBM, have negotiated better than more hardware-centric companies like Sun. While virtually every major vendor is in the process of transforming itself into a services-driven organization, those efforts could be complicated if Intel’s new Itanium strategy gains momentum. Intel may be willing and able to afford commodity pricing of products that have yet to achieve commodity status. What the company’s strategy will cost its competitors remains to be seen.

Fighting Terrorism with IT

By Jim Balderston

During this past week’s hearings in front of the 9/11 Commission, officials from the FBI testified that they had repeatedly asked Congress for funds to upgrade their existing IT infrastructure and were consistently denied that funding. FBI officials went so far as to describe their IT infrastructure as a “joke” that was widely known as such in the nation’s capital and in Congress. Efforts to improve this situation date as far back as 1998, officials testified. Many of the pre-9/11 failures of intelligence and law enforcement agencies now being uncovered center on the failure to share or disseminate pertinent information to the appropriate parties. Communications between intelligence agencies, as well as within the agencies themselves, have been deemed poor. In one instance, FBI officials said they were forced to fax alerts to various field offices because of their poor IT infrastructure.

The FBI and the CIA have a long history of inter-agency acrimony, as one is tasked exclusively with domestic issues and the other with foreign ones. In some cases, there are disputes as to who has authority over an investigation, or control over the information needed to pursue investigations. As a result, both agencies have become stingy with intelligence. But as this week’s 9/11 Commission hearings revealed, it seems that technological as well as cultural issues plague the FBI and CIA. Not only do the two agencies have trouble communicating with each other, they have problems communicating within their own domains. One might argue that creating such islands of information provides a unique level of security, but it does not foster the ability to communicate effectively when events require it. Security by obscurity, taken to its logical conclusion, would require the complete dismantling of the network. What could be more secure, or more useless, than a series of unconnected and inaccessible chunks of information?

The private sector has learned a great deal about the value of sharing information via technological means. Through the increased adoption of standards-based technology, enterprises are realizing that they can make substantial gains by exposing more information to their customers, partners, and suppliers. They may risk exposing more information to their competitors in the process, but the value of informing and educating customers, partners, and suppliers far outstrips any potential downside. The same holds true, to a significant degree, within and between the FBI and the CIA. Their ability to share information externally and internally should be a key tool in combating terrorism both domestically and internationally in the coming years. In this light, the FBI’s requests for “upgrades” of its existing IT infrastructure seem insufficient. What we see as paramount is not only a change in culture, but an essential evolution in the way information is accessed and made accessible. Instead of aiming merely to upgrade, we would argue the FBI and the CIA should be seeking ways to integrate their IT infrastructures based on standards, so that intelligence can be wrung from information through correlation and dissemination, as sketched below. Proper IT integration strategies can help this evolution, which in turn will help change the culture that prevented U.S. agencies from effectively “connecting the dots” before 9/11.
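To make the correlation-and-dissemination point concrete, here is a minimal, entirely hypothetical sketch: if two agencies published alerts in a shared, standards-based XML format, even a trivial join on a common field could surface connections that siloed systems would miss. The schema, agency names, and sample records are all invented for illustration.

```python
# Hypothetical sketch: correlating alerts from two agencies that share a
# common, standards-based XML format. Schema and data are invented.
import xml.etree.ElementTree as ET
from collections import defaultdict

FEED_A = """<alerts>
  <alert agency="AgencyA" subject="J. Doe" detail="flight training inquiry"/>
  <alert agency="AgencyA" subject="R. Roe" detail="visa overstay"/>
</alerts>"""

FEED_B = """<alerts>
  <alert agency="AgencyB" subject="J. Doe" detail="watchlist match abroad"/>
</alerts>"""

def correlate(*feeds):
    """Group alerts from all feeds by subject; keep multi-source subjects."""
    by_subject = defaultdict(list)
    for feed in feeds:
        for alert in ET.fromstring(feed).iter("alert"):
            by_subject[alert.get("subject")].append(
                (alert.get("agency"), alert.get("detail")))
    return {s: hits for s, hits in by_subject.items() if len(hits) > 1}

for subject, hits in correlate(FEED_A, FEED_B).items():
    print(f"Cross-agency match on {subject}: {hits}")
```

The point is not the dozen lines of code but the precondition: without an agreed-upon format and the will to publish into it, no amount of infrastructure upgrading produces the join.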

Microsoft and Micro Focus Announce Mainframe Migration Alliance

By Charles King

Microsoft and Micro Focus International have announced a new alliance to enable the migration of mainframe systems onto the Windows operating system using Microsoft .NET technology. The alliance aims to move application workloads from mainframe environments to Intel-based Windows Server platforms. According to the two companies, such migrations would help customers reduce the cost of maintaining and modernizing aging mainframe systems. Micro Focus’s Enterprise Server, with its new Mainframe Transaction Option, underpins this effort and allows CICS/COBOL mainframe applications to be re-hosted onto Windows and then extended through the use of the .NET Framework, SQL Server 2000, XML, and Web services. No pricing or availability information was included in the announcement. In an unrelated event, Microsoft released four security patches to protect Windows users against twenty vulnerabilities, eight of which were listed as critical. All versions of Windows require patches, as do Exchange servers, IIS Web servers, Internet Explorer, Outlook, and Outlook Express.
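For readers unfamiliar with the re-hosting pattern, the following sketch shows the general shape of what the alliance describes: legacy transaction logic wrapped behind an XML-over-HTTP interface so that newer Web-services clients can reach it. Python stands in for re-hosted COBOL, and every name and field in the example is hypothetical; this is not Micro Focus’s actual API.

```python
# Hypothetical sketch of re-hosted legacy logic exposed as an XML Web service.
# Python stands in for re-hosted CICS/COBOL; all names and fields are invented.
from http.server import BaseHTTPRequestHandler, HTTPServer
from urllib.parse import urlparse, parse_qs
import xml.etree.ElementTree as ET

def legacy_account_inquiry(account_id: str) -> dict:
    """Stand-in for a re-hosted mainframe account-inquiry transaction."""
    return {"account": account_id, "balance": "1234.56", "status": "OK"}

class InquiryHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        # e.g. GET /inquiry?account=42 returns an XML document
        query = parse_qs(urlparse(self.path).query)
        account = query.get("account", ["unknown"])[0]
        root = ET.Element("inquiryResponse")
        for key, value in legacy_account_inquiry(account).items():
            ET.SubElement(root, key).text = value
        body = ET.tostring(root)
        self.send_response(200)
        self.send_header("Content-Type", "text/xml")
        self.end_headers()
        self.wfile.write(body)

if __name__ == "__main__":
    HTTPServer(("localhost", 8080), InquiryHandler).serve_forever()
```

Once the legacy logic is reachable this way, extending it with the .NET Framework, a relational database, or other Web services is a conventional integration exercise rather than a mainframe project.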

Coming shortly after IBM celebrated the 40th anniversary of its venerable S/360 mainframe, the Microsoft/Micro Focus announcement offers some interesting food for thought. To begin, the two companies’ mainframe migration alliance is anything but unique. Both Sun and HP have attempted, with varying results, similar mainframe migration efforts. For that matter, Micro Focus has since 2001 focused on re-hosting COBOL applications on contemporary and emerging platforms including Linux, Java, UNIX, OS/400, and .NET. The attractiveness of such a business model is obvious. It is estimated that up to three-quarters of the world’s business data resides on mainframe systems, which translates into a notably fat pie no matter who is doing the slicing. Carving off even a modest piece of legacy mainframe business would result in a healthy repast for Micro Focus. Microsoft’s interest in the mainframe market is also obvious, considering the company’s ongoing efforts to promote itself as the enterprise platform of the future. Indeed, the continuing price/performance enhancements of Wintel solutions are likely to elicit the interest of a select number of mainframe customers, and with good reason. The fact is that the days when the mainframe was the only enterprise computing game in town are long over. In addition, a goodly number of enterprises with small or even shrinking legacy mainframe requirements are likely to find less costly options, be they Wintel, UNIX, or OS/400, increasingly attractive.

The real question such users need to answer is how these options compare to their existing systems. Re-hosted Wintel solutions might provide acceptable levels of application performance for users with low MIPS requirements, but as Microsoft’s security patches for twenty new vulnerabilities attest, the Wintel platform presents other issues for mainframe customers to consider. Wintel equivalents for robust mainframe external security management solutions such as IBM’s RACF and CA’s Top Secret do not exist, nor do the user authentication and auditing tools that make mainframe environments particularly difficult for hackers to penetrate. Microsoft has repeatedly stated its dedication to correcting ongoing security problems, but the sheer complexity of Windows platforms is likely to complicate and postpone any comprehensive fix. This does not make Wintel solutions completely unusable for mainframe re-hosting, but it does limit the areas where they would be appropriate. As attested by its new mid-market-focused eServer z890, IBM is following a strategy of developing lower-capacity solutions that fit the needs and expectations of smaller mainframe customers. But so long as those customers exist, competitors like Microsoft and Micro Focus will be waiting eagerly with plates and forks in hand.

Making the Right Moves

By Jim Balderston

Veritas and BEA announced this week that they have created an alliance in which they will jointly develop, market, and sell products that combine the companies’ areas of expertise. BEA will provide its WebLogic and Tuxedo application servers for the effort, while Veritas will provide its Indepth management software, OpForce server provisioning product, and Cluster Server high-availability solution. The companies said that their joint products will be standards-based and applicable to heterogeneous environments, and that they intend to position them for the utility computing market. The companies said they will roll out the new products in the second half of the year, when pricing will be announced.

For both BEA and Veritas, this move makes a great deal of sense. By joining with BEA, Veritas gains the ability to move away a bit from its traditional position in Sun’s orbit. Considering Sun’s ongoing problems, this makes sense from both a tactical and a strategic standpoint. BEA gets to add layers of IT management capability to its application server solutions, broadening their capabilities and, potentially, their market reach. Overall, by combining their traditional areas of strength, Veritas and BEA should produce joint products attractive to potential customers, who will enjoy dealing with what amounts to a single vendor that can claim real expertise in two divergent but business-critical areas of enterprise IT management.

But in a larger sense, this move plays into the market’s greater momentum toward utility computing. While IT systems vendors such as IBM and HP have focused a great deal of rhetorical firepower on this subject, the fact is that hardware’s role in utility computing is subservient to software’s. In such environments, applications become largely nomadic entities that can run on any number of hardware assets, sometimes moving on a regularly scheduled basis within a 24-hour period to meet a range of business demands. In such circumstances, the ability to manage, move, monitor, and manipulate applications across multiple platforms matters more than the hardware itself and is crucial to the successful delivery of utility computing services. We see BEA’s and Veritas’s joint venture as recognizing the simple fact that utility computing is really all about software. Since applications can reside in many and any places, the ability to deliver them when and as needed is the core of utility computing. In addition, the partnership suggests that software vendors who regularly work with multiple hardware platforms can provide natural leadership for utility computing solutions. In our minds, the key differentiators in this market will be innovations in software management and control, which will allow enterprises deploying utility computing environments to make the right moves and profit from them. Given Veritas’s and BEA’s past success in their separate bailiwicks, combining their talents should offer considerable opportunities for future success.
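As a closing illustration of the software-centric point, consider a toy placement function: given server capacities and workload demands, software decides where each application runs, and re-running that decision as demands shift is what makes the applications “nomadic.” All names and numbers here are invented; this is a sketch of the concept, not of any Veritas or BEA product.

```python
# Toy sketch of utility-computing placement: software assigns nomadic
# workloads to whichever servers have headroom. All values are invented.
servers = {"east-1": 100, "east-2": 100, "west-1": 60}    # capacity units
workloads = [("billing", 70), ("storefront", 50), ("reporting", 30)]

def place(workloads, servers):
    """Greedy placement: largest workload first, onto the emptiest server."""
    free = dict(servers)
    placement = {}
    for name, demand in sorted(workloads, key=lambda w: -w[1]):
        target = max(free, key=free.get)          # emptiest server
        if free[target] < demand:
            raise RuntimeError(f"no capacity for {name}")
        free[target] -= demand
        placement[name] = target
    return placement

# Re-running place() as demand shifts over a 24-hour cycle is the kind of
# software-level "right move" the piece argues matters more than hardware.
print(place(workloads, servers))
```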


The Sageza Group, Inc.

32108 Alvarado Blvd #354

Union City, CA 94587

650·390·0700     fax 650·649·2302

London +44 (0) 20·7900·2819

Milan +39 02·9544·1646

 

sageza.com

 

Copyright © 2004 The Sageza Group, Inc. May not be duplicated or retransmitted without written permission.